77 research outputs found

    On the Statistics of Trustworthiness Prediction

    Trust and trustworthiness facilitate interactions between human beings worldwide, every day. They enable the formation of friendships, the making of profits, and the adoption of new technologies, making life not only more pleasant but also furthering societal development. Trust, for lack of a better word, is good. When human beings trust, they rely on the trusted party to be trustworthy, that is, literally worthy of the trust that is being placed in them. If it turns out that the trusted party is unworthy of the trust placed in it, the truster has misplaced its trust, has relied unwarrantedly, and is liable to experience unpleasant consequences. Human social evolution has equipped us with tools for determining another's trustworthiness through experience, cues, and observations, with which we aim to minimise the risk of misplacing our trust. Social adaptation, however, is a slow process, and the cues that are helpful in real, physical environments where we can observe and hear our interlocutors are less helpful in interactions that are conducted over data networks with other humans or computers, or even between two computers. This presents a challenge in a world where the virtual and the physical intermesh increasingly; a challenge that computational trust models seek to address by applying computational, evidence-based methods to estimate trustworthiness.

    In this thesis, the state of the art in evidence-based trust models is extended and improved upon, in particular with regard to their statistical modelling. The statistics behind (Bayesian) trustworthiness estimation receive special attention; their extension brings about improvements in trustworthiness estimation that encompass the following aspects: (i) statistically well-founded estimators for binomial and multinomial models of trust that can accurately estimate the trustworthiness of another party and express the inherent uncertainty of the trustworthiness estimate in a statistically meaningful way, (ii) better integration of recommendations by third parties, using advanced methods for determining the reliability of the received recommendations, (iii) improved responsiveness to changes in the behaviour of trusted parties, and (iv) increased generalisability of trust-relevant information over a set of trusted parties. Novel estimators, methods for combining recommendations and other trust-relevant information, change detectors, as well as a mapping for integrating stereotype-based trustworthiness estimates, are bundled in an improved Bayesian trust model, Multinomial CertainTrust. The specific scientific contributions are structured into three distinct categories:

    1. A Model for Trustworthiness Estimation: The statistics of trustworthiness estimation are investigated in order to design a fully multinomial trustworthiness estimation model. Leveraging the assumptions behind the Bayesian estimation of binomial and multinomial proportions, accurate trustworthiness and certainty estimators are presented, and the integration of subjectivity via informative and non-informative Bayesian priors is discussed.

    2. Methods for Trustworthiness Information Processing: Methods for facilitating trust propagation and accounting for concept drift in the behaviour of the trusted parties are introduced. All methods are applicable, by design, to both the binomial and the multinomial case of trustworthiness estimation.

    3. Further Extensions for Trustworthiness Estimation: Two methods for addressing the potential lack of direct experiences with new trustees in feedback-based trust models are presented. First, the dedicated modelling of particular roles and the trust delegation between them is shown to be possible in principle as an extension to existing feedback-based trust models. Second, a more general approach to feature-based generalisation using model-free, supervised machine learners is introduced.

    The general properties of the trustworthiness and certainty estimators are derived formally from the basic assumptions underlying binomial and multinomial estimation problems, harnessing fundamentals of Bayesian statistics. Desired properties for the introduced certainty estimators, first postulated by Wang & Singh, are shown to hold through formal argument. The general soundness and applicability of the proposed certainty estimators are founded on the statistical properties of interval estimation techniques, which are discussed and rigorously shown in the related statistics literature. The core estimation system and additional methods, in their entirety constituting the Multinomial CertainTrust model, are implemented in R, along with competing methods from the related work, specifically for determining recommender trustworthiness and for coping with changing behaviour through ageing. The performance of the novel methods introduced in this thesis was tested in simulations against established methods from the related work. Methods for hardcoding indicators of trustworthiness were implemented within a multi-agent framework and shown to be functional in an agent-based simulation. Furthermore, supervised machine learners were tested for their applicability by collecting a real-world data set of reputation data from a hotel booking site and evaluating their capabilities against this data set. The hotel data set exhibits properties, such as a high imbalance in the ratings, that appear typical of data generated by reputation systems, as they are also present in other data sets.
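
    The Bayesian core of such a model can be sketched in a few lines: under a Dirichlet prior, the posterior mean of the outcome proportions serves as the trustworthiness estimate. The snippet below is a minimal illustration of this idea, not the thesis's Multinomial CertainTrust implementation (which is in R); in particular, the simple evidence-ratio certainty proxy is an assumption of this sketch.

```python
import numpy as np

def multinomial_trust_estimate(counts, prior=None):
    """Posterior-mean trustworthiness estimate over k outcome categories.

    counts : observed interaction outcomes, one count per category
    prior  : Dirichlet pseudo-counts; defaults to the non-informative
             uniform prior alpha_i = 1
    """
    counts = np.asarray(counts, dtype=float)
    alpha = np.ones_like(counts) if prior is None else np.asarray(prior, float)
    posterior = counts + alpha                 # Dirichlet posterior parameters
    estimate = posterior / posterior.sum()     # posterior mean of the proportions
    # Simple certainty proxy (an assumption of this sketch, not the
    # thesis's estimator): share of posterior mass contributed by evidence.
    certainty = counts.sum() / (counts.sum() + alpha.sum())
    return estimate, certainty

# Example: 8 positive, 1 neutral, 1 negative outcomes with a trustee
estimate, certainty = multinomial_trust_estimate([8, 1, 1])
print(estimate.round(3), round(certainty, 3))  # [0.692 0.154 0.154] 0.769
```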

    Automatic 3D modeling by combining SBFEM and transfinite element shape functions

    The scaled boundary finite element method (SBFEM) has recently been employed as an efficient means to model three-dimensional structures, in particular when the geometry is provided as a voxel-based image. To this end, an octree decomposition of the computational domain is deployed and each cubic cell is treated as an SBFEM subdomain. The surfaces of each subdomain are discretized in the finite element sense. We improve on this idea by combining the semi-analytical concept of the SBFEM with certain transition elements on the subdomains' surfaces. Thus, we avoid the triangulation of surfaces employed in previous works and consequently reduce the number of surface elements and degrees of freedom. In addition, these discretizations allow the coupling of elements of arbitrary order, such that local p-refinement can be achieved straightforwardly.
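
    The octree step can be sketched as follows: a cubic cell over the voxel image is recursively split into eight children until its voxels are homogeneous or a minimum size is reached, and each leaf then acts as one SBFEM subdomain. This is a toy illustration of the decomposition idea only, not the authors' implementation.

```python
import numpy as np

def build_octree(voxels, origin, size, min_size=1):
    """Recursively subdivide a cubic cell over a 3D voxel image.

    A cell becomes a leaf when all voxels it covers carry the same
    material label or the minimum cell size is reached; each leaf
    would then be treated as one SBFEM subdomain. `size` is assumed
    to be a power of two.
    """
    x, y, z = origin
    block = voxels[x:x + size, y:y + size, z:z + size]
    if size <= min_size or np.all(block == block.flat[0]):
        return [(origin, size, int(block.flat[0]))]
    half = size // 2
    leaves = []
    for dx in (0, half):
        for dy in (0, half):
            for dz in (0, half):
                leaves += build_octree(voxels, (x + dx, y + dy, z + dz),
                                       half, min_size)
    return leaves

# Toy voxel image: 8^3 domain of material 0 with a 4^3 inclusion of material 1
img = np.zeros((8, 8, 8), dtype=int)
img[2:6, 2:6, 2:6] = 1
print(len(build_octree(img, (0, 0, 0), 8)), "leaf cells / SBFEM subdomains")
```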

    Insights into the classification of small GTPases

    In this study, we used a Random Forest-based approach for the assignment of small guanosine triphosphatases (GTPases) to specific subgroups. Small GTPases represent an important functional group of proteins that serve as molecular switches in a wide range of fundamental cellular processes, including intracellular transport, movement, and signaling events. These proteins have gained special emphasis in cancer research because, over the last decades, a wide variety of small GTPases from different subgroups has been linked to the development of many types of tumors. Using the Random Forest approach, we were able to identify the most important amino acid positions for the classification process within the small GTPase superfamily and its subgroups. These positions are in line with the results of earlier studies and have been shown to be essential elements for the different functionalities of the GTPase families. Furthermore, we provide an accurate and reliable software tool (GTPasePred) to identify potential novel GTPases and demonstrate its application to genome sequences.
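
    The classification idea can be sketched with scikit-learn: one-hot encode each alignment position, train a Random Forest on labelled sequences, and read the per-position importances off the trained model. The sequences, labels, and encoding below are hypothetical placeholders, not the GTPasePred tool or its training data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.preprocessing import OneHotEncoder

# Toy input: aligned amino-acid sequences (one column per alignment
# position) and their subgroup labels; real training data would come
# from annotated small-GTPase sequences.
sequences = np.array([list("GKSALTG"), list("GKTALTG"),
                      list("GDSAVSG"), list("GDTAVSG")])
labels = ["Ras", "Ras", "Rab", "Rab"]

# One-hot encode each alignment column so positions stay identifiable.
encoder = OneHotEncoder(handle_unknown="ignore")
X = encoder.fit_transform(sequences)

forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(X, labels)

# Sum the per-feature importances back to alignment positions to rank
# the positions that drive the subgroup assignment.
per_position = {}
for name, imp in zip(encoder.get_feature_names_out(), forest.feature_importances_):
    pos = int(name.split("_")[0][1:])      # feature names look like "x2_T"
    per_position[pos] = per_position.get(pos, 0.0) + imp
print(sorted(per_position.items(), key=lambda kv: -kv[1]))
```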

    Plasma Disappearance Rate of Indocyanine Green for Determination of Liver Function in Three Different Models of Shock

    The measurement of liver function via the plasma disappearance rate of indocyanine green (PDRICG) is a sensitive bedside tool in critical care. Yet, recent evidence has questioned the value of this method under hyperdynamic conditions. To evaluate the technique in different hemodynamic settings, we analyzed PDRICG and the corresponding pharmacokinetic models after endotoxemia or hemorrhagic shock in rats. Male anesthetized Sprague-Dawley rats underwent hemorrhage (mean arterial pressure 35 ± 5 mmHg, 90 min) followed by 2 h of reperfusion, or moderate or severe lipopolysaccharide (LPS)-induced endotoxemia (1.0 vs. 10 mg/kg) for 6 h (each n = 6). Afterwards, PDRICG was measured, and pharmacokinetic models were analyzed using nonlinear mixed-effects modeling (NONMEM®). Hemorrhagic shock resulted in a significant decrease of PDRICG compared with sham controls, and a corresponding attenuation of the calculated ICG clearance in 1- and 2-compartment models, with the same log-likelihood. The induction of severe, but not moderate, endotoxemia led to a significant reduction of PDRICG. The calculated ICG blood clearance was reduced in 1-compartment models for both septic conditions. Two-compartment models performed with a significantly better log-likelihood, and the calculated clearance of ICG did not correspond well with PDRICG in either LPS group. Three-compartment models did not improve the log-likelihood in any experiment. These results demonstrate that PDRICG correlates well with ICG clearance in 1- and 2-compartment models after hemorrhage. In endotoxemia, which was best described by a 2-compartment model, PDRICG may not truly reflect the ICG clearance.
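
    For orientation, in the simplest (1-compartment) model referenced above, ICG decays mono-exponentially, C(t) = C0 * exp(-k*t); PDRICG is the decay constant k expressed in %/min, and clearance follows as k times the distribution volume. The sketch below fits such a model to hypothetical concentration data; the sampled values and the distribution volume are assumptions of this sketch, not the study's measurements or its NONMEM® analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def one_compartment(t, c0, k):
    """Mono-exponential ICG decay: C(t) = c0 * exp(-k * t)."""
    return c0 * np.exp(-k * t)

# Hypothetical ICG plasma concentrations (mg/L) sampled over 15 min
t = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0, 15.0])
c = np.array([4.1, 3.4, 2.4, 1.7, 1.2, 0.85, 0.35])

(c0, k), _ = curve_fit(one_compartment, t, c, p0=(5.0, 0.2))
pdr_icg = 100.0 * k          # plasma disappearance rate in %/min
v_d = 3.0                    # assumed distribution volume in L (hypothetical)
clearance = k * v_d          # clearance as decay constant times volume
print(f"PDR_ICG = {pdr_icg:.1f} %/min, CL = {clearance:.2f} L/min")
```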

    High order transition elements: The xNy-element concept -- Part I: Statics

    Advanced transition elements are of utmost importance in many applications of the finite element method (FEM) where local mesh refinement is required. For problems that exhibit singularities in the solution, an adaptive hp-refinement procedure must be applied. Even today, this is a very demanding task, especially if only quadrilateral/hexahedral elements are deployed and, consequently, the hanging-nodes problem is encountered. These element types are, however, favored in computational mechanics due to their improved accuracy compared to triangular/tetrahedral elements. Therefore, we propose a compatible transition element, the xNy-element, which provides the capability of coupling different element types. The adjacent elements can exhibit different element sizes, shape function types, and polynomial orders. Thus, it is possible to combine independently refined h- and p-meshes. The approach is based on the transfinite mapping concept and constitutes an extension and generalization of the pNh-element concept. By means of several numerical examples, the convergence behavior is investigated in detail, and the asymptotic rates of convergence are determined numerically. Overall, the proposed approach provides very promising results for local mesh refinement procedures.
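
    The transfinite mapping concept the approach builds on can be illustrated in 2D by Coons interpolation, which blends four boundary curves into a surface parameterization while subtracting the doubly counted bilinear corner term. The sketch below is a generic textbook construction, not the xNy-element formulation itself.

```python
import numpy as np

def coons_patch(c_b, c_t, c_l, c_r, u, v):
    """Transfinite (Coons) interpolation of four boundary curves in 2D.

    c_b, c_t map u in [0, 1] to the bottom/top edges; c_l, c_r map
    v in [0, 1] to the left/right edges. The bilinear corner term is
    subtracted because it is counted by both ruled surfaces.
    """
    ruled_bt = (1 - v) * c_b(u) + v * c_t(u)
    ruled_lr = (1 - u) * c_l(v) + u * c_r(v)
    corners = ((1 - u) * (1 - v) * c_b(0) + u * (1 - v) * c_b(1)
               + (1 - u) * v * c_t(0) + u * v * c_t(1))
    return ruled_bt + ruled_lr - corners

# Unit square whose top edge bulges parabolically upwards
c_b = lambda u: np.array([u, 0.0])
c_t = lambda u: np.array([u, 1.0 + 0.2 * u * (1 - u)])
c_l = lambda v: np.array([0.0, v])
c_r = lambda v: np.array([1.0, v])
print(coons_patch(c_b, c_t, c_l, c_r, 0.5, 1.0))  # reproduces the curved top edge
```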

    Impact of Working Memory Load on fMRI Resting State Pattern in Subsequent Resting Phases

    BACKGROUND: The default-mode network (DMN) is a functional network of increasing relevance for psychiatric research, characterized by increased activation at rest and decreased activation during task performance. The degree of DMN deactivation during a cognitively demanding task depends on the task's difficulty. However, how hemodynamic responses behave in resting phases after a preceding cognitive challenge remains relatively unexplored. We test the hypothesis that the degree of DMN activation during rest is influenced by the cognitive load of the preceding working-memory task. METHODOLOGY/PRINCIPAL FINDINGS: Twenty-five healthy subjects were investigated with functional MRI at 3 Tesla while performing a working-memory task with embedded short resting phases. Data were decomposed into statistically independent spatio-temporal components using Tensor Independent Component Analysis (TICA). The DMN was selected using a template-matching procedure. Its spatial map contained rest-related activations in the medial frontal cortex and the ventral anterior and posterior cingulate cortex. The time course of the DMN revealed increased activation at rest after 1-back and 2-back blocks compared with the activation after a 0-back block. CONCLUSION/SIGNIFICANCE: We present evidence that a cognitively challenging working-memory task is followed by greater DMN activation than a simple letter-matching task. This might be interpreted as a functional correlate of self-evaluation and reflection on the preceding task, or as a relocation of cerebral resources representing recovery from high cognitive demands. This finding is highly relevant for neuroimaging studies that include resting phases in cognitive tasks as stable baseline conditions. Further studies investigating the DMN should take possible interactions between tasks and subsequent resting phases into account.
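
    The decomposition-and-selection step can be sketched as follows. Since Tensor ICA is not available in common Python libraries, the sketch substitutes scikit-learn's FastICA as a stand-in and selects the component whose spatial map best matches a DMN template; all data, dimensions, and the template here are synthetic assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic stand-in for fMRI data: 200 time points x 500 voxels, with
# one embedded "DMN-like" component (template map x rest-related course).
dmn_template = (rng.standard_normal(500) > 1.2).astype(float)
dmn_course = np.sin(np.linspace(0.0, 8.0 * np.pi, 200))
data = rng.standard_normal((200, 500)) + 2.0 * np.outer(dmn_course, dmn_template)

# Decompose into independent components; FastICA serves here as a
# stand-in for the Tensor ICA (TICA) used in the study.
ica = FastICA(n_components=10, random_state=0)
time_courses = ica.fit_transform(data)   # (time points, components)
spatial_maps = ica.mixing_.T             # (components, voxels)

# Template matching: keep the component whose spatial map correlates
# best (in absolute value) with the DMN template.
corrs = [abs(np.corrcoef(m, dmn_template)[0, 1]) for m in spatial_maps]
best = int(np.argmax(corrs))
print(f"selected component {best}, |r| = {corrs[best]:.2f}")
```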

    In Silico Approaches and the Role of Ontologies in Aging Research

    The 2013 Rostock Symposium on Systems Biology and Bioinformatics in Aging Research was again dedicated to dissecting the aging process by in silico means. A particular focus was on ontologies, as these are a key technology for systematically integrating heterogeneous information about the aging process. Related topics were databases and data integration. Other talks tackled modeling issues and applications, the latter including talks focussed on marker development and cellular stress, as well as on diseases, in particular diseases of the kidney and skin.

    Overview of the PALM model system 6.0

    In this paper, we describe the PALM model system 6.0. PALM (formerly an abbreviation for Parallelized Large-eddy Simulation Model and now an independent name) is a Fortran-based code that has been applied to study a variety of atmospheric and oceanic boundary layers for about 20 years. The model is optimized for use on massively parallel computer architectures. This is a follow-up paper to the PALM 4.0 model description in Maronga et al. (2015). In recent years, PALM has been significantly improved and now offers a variety of new components. In particular, much effort was made to enhance the model with components needed for applications in urban environments, such as fully interactive land-surface and radiation schemes, chemistry, and an indoor model. This paper provides an overview of the PALM 6.0 model system and describes its current model core. The individual components for urban applications, case studies, validation runs, and issues with suitable input data are presented and discussed in a series of companion papers in this special issue.

    Improved Bevirimat resistance prediction by combination of structural and sequence-based classifiers

    Background: Maturation inhibitors such as Bevirimat are a new class of antiretroviral drugs that hamper the cleavage of HIV-1 proteins into their functional, active forms. They bind to these preproteins and inhibit their cleavage by the HIV-1 protease, resulting in non-functional virus particles. Nevertheless, mutations exist in this region that lead to resistance against Bevirimat. Highly specific and accurate tools to predict resistance to maturation inhibitors can help to identify patients who might benefit from the use of these new drugs. Results: We tested several methods to improve Bevirimat resistance prediction in HIV-1. It turned out that combining structural and sequence-based information in classifier ensembles led to accurate and reliable predictions. Moreover, we were able to computationally identify the most crucial regions for Bevirimat resistance, which are in line with experimental results from other studies. Conclusions: Our analysis demonstrates the use of machine learning techniques to predict HIV-1 resistance against maturation inhibitors such as Bevirimat. New maturation inhibitors are already under development and might enlarge the arsenal of antiretroviral drugs in the future. Thus, accurate prediction tools are very useful for enabling personalized therapy.
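
    The ensemble idea, combining a sequence-based and a structure-based classifier, can be sketched with scikit-learn's soft-voting combination. The features, data, and base classifiers below are placeholder assumptions, not the study's actual descriptors or models.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer, StandardScaler

rng = np.random.default_rng(1)

# Placeholder data: 120 samples with 30 sequence-derived features
# (e.g., mutation indicators) followed by 10 structure-derived ones.
X = np.hstack([rng.integers(0, 2, size=(120, 30)).astype(float),
               rng.standard_normal((120, 10))])
y = rng.integers(0, 2, size=120)        # 0 = susceptible, 1 = resistant

# Each base classifier only sees "its" feature block; soft voting then
# averages the predicted class probabilities of the two classifiers.
seq_clf = make_pipeline(FunctionTransformer(lambda X: X[:, :30]),
                        RandomForestClassifier(n_estimators=100, random_state=0))
struct_clf = make_pipeline(FunctionTransformer(lambda X: X[:, 30:]),
                           StandardScaler(),
                           LogisticRegression())
ensemble = VotingClassifier([("seq", seq_clf), ("struct", struct_clf)],
                            voting="soft")
ensemble.fit(X, y)
print(ensemble.predict_proba(X[:3]).round(2))
```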